Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
Author
Abstract
Kernel logistic regression (KLR) is a powerful and flexible classification algorithm that can provide the confidence of its class predictions. However, its training, typically carried out by (quasi-)Newton methods, is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by a log-linear combination of kernel functions, and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically just by solving a regularized system of linear equations in a class-wise manner. LSPC is therefore computationally very efficient and numerically stable. Through experiments, we show that the computation time of LSPC is faster than that of KLR by orders of magnitude, with comparable classification accuracy.
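For concreteness, the class-wise regularized least-squares fit described above can be sketched in a few lines of NumPy: the design matrix of kernel values is shared by all classes, and each class then only requires solving one regularized linear system. This is a minimal sketch under assumed choices (a Gaussian kernel of width sigma, kernel centers placed at all training points, a single regularization parameter lam); the function names lspc_fit and lspc_predict_proba are illustrative and not taken from the paper.

import numpy as np

def gaussian_kernel(X, C, sigma):
    # Gaussian kernel matrix between rows of X (n x d) and centers C (b x d).
    sq_dist = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def lspc_fit(X, y, sigma=1.0, lam=0.1):
    # Fit one parameter vector per class by solving a regularized linear system.
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n = len(X)
    classes = np.unique(y)
    Phi = gaussian_kernel(X, X, sigma)            # basis functions centered at training points
    A = Phi.T @ Phi / n + lam * np.eye(n)         # shared, class-independent system matrix
    Theta = np.empty((n, len(classes)))
    for k, c in enumerate(classes):
        h = Phi[y == c].sum(axis=0) / n           # empirical average of basis values over class c
        Theta[:, k] = np.linalg.solve(A, h)       # analytic class-wise solution
    return {"centers": X, "sigma": sigma, "classes": classes, "Theta": Theta}

def lspc_predict_proba(model, X_test):
    # Evaluate the linear model, clip negative outputs to zero, and normalize per row.
    Phi = gaussian_kernel(np.asarray(X_test, dtype=float), model["centers"], model["sigma"])
    scores = np.maximum(Phi @ model["Theta"], 0.0)
    total = scores.sum(axis=1, keepdims=True)
    total[total == 0.0] = 1.0                     # guard against all-zero rows
    return scores / total

Because the system matrix is identical for every class, it can be factored once and reused across classes, which is where the speedup over the iterative (quasi-)Newton training of KLR comes from; sigma and lam would normally be tuned by cross-validation.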
Similar articles
Least-squares Probabilistic Classifier: a Computationally Efficient Alternative to Kernel Logistic Regression
The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression (KLR). A key idea for the speedup is that, unlike KLR that uses maximum likelihood estimation for a log-linear model, LSPC uses least-squares estimation for a linear model. This allows us to obtain a global solution analytically in a classwise manner. In exchange for the sp...
Computationally Efficient Multi-task Learning with Least-squares Probabilistic Classifiers
Probabilistic classification and multi-task learning are two important branches of machine learning research. Probabilistic classification is useful when the ‘confidence’ of decision is necessary. On the other hand, the idea of multi-task learning is beneficial if multiple related learning tasks exist. So far, kernelized logistic regression has been a vital probabilistic classifier for the use ...
Computationally Efficient Multi-Label Classification by Least-Squares Probabilistic Classifiers
Multi-label classification allows a sample to belong to multiple classes simultaneously, which is often the case in real-world applications such as text categorization and image annotation. In multi-label scenarios, taking into account correlations among multiple labels can boost the classification accuracy. However, this makes classifier training more challenging because handling multiple labe...
MAPLSC: A novel multi-class classifier for medical diagnosis
Analysis of clinical records contributes to the Traditional Chinese Medicine (TCM) experience expansion and techniques promotion. More than two diagnostic classes (diagnostic syndromes) in the clinical records raise a popular data mining problem: multi-value classification. In this paper, we propose a novel multi-class classifier, named Multiple Asymmetric Partial Least Squares Classifier (MAPL...
Multiclass Least Squares Twin Support Vector Machine for Pattern Classification
This paper proposes a Multiclass Least Squares Twin Support Vector Machine (MLSTSVM) classifier for multi-class classification problems. The formulation of MLSTSVM is obtained by extending the formulation of the recently proposed binary Least Squares Twin Support Vector Machine (LSTSVM) classifier. For an M-class classification problem, the proposed classifier seeks M non-parallel hyperplanes, one fo...
Journal: IEICE Transactions
Volume: 93-D
Issue: -
Pages: -
Publication year: 2010